:''This article deals with a subset of the intellectual process of intelligence analysis itself, as opposed to intelligence analysis management, which in turn is a subcomponent of intelligence cycle management. For a complete hierarchical list of articles in this series, see the intelligence cycle management hierarchy.''

Intelligence analysis is plagued by many of the cognitive traps also encountered in other disciplines. The first systematic study of the specific pitfalls lying between an intelligence analyst and clear thinking was carried out by Dick Heuer. According to Heuer, these traps may be rooted either in the analyst's organizational culture or in his or her own personality.

==Types==
The most common personality trap, known as ''mirror-imaging'', is the analyst's assumption that the people being studied think like the analyst does. An important variation is to confuse actual subjects with one's information or images about them, much as one might confuse the apple one eats with one's ideas about that apple. This confusion poses a dilemma for the scientific method in general, since science uses information and theory to represent complex natural systems as if theoretical constructs were in control of indefinable natural processes. The inability to distinguish a subject from one's thinking about that subject is also examined under the heading of functional fixedness, first studied in Gestalt psychology and in relation to the subject-object problem.

Experienced analysts may recognize that they have fallen prey to mirror-imaging if they discover that they are unwilling to examine variants of what they consider most reasonable in light of their personal frame of reference. Less perceptive analysts affected by this trap may regard legitimate objections as personal attacks, rather than looking beyond ego to the merits of the question. Peer review, especially by people from a different background, can be a wise safeguard. Organizational culture can also create traps that render individual analysts unwilling to challenge acknowledged experts in the group.

Another trap, ''target fixation'', has an analogy in aviation: it occurs when pilots become so intent on delivering their ordnance that they lose sight of the big picture and crash into the target. This is a more basic human tendency than many realize. Analysts may fixate on one hypothesis, looking only at evidence consistent with their preconceptions and ignoring other relevant views. The desire for rapid closure is another form of idea fixation.

"Familiarity with terrorist methods, repeated attacks against U.S. facilities overseas, combined with indications that the continental United States was at the top of the terrorist target list might have alerted us that we were in peril of a significant attack. And yet, for reasons those who study intelligence failure will find familiar, 9/11 fits very much into the norm of surprise caused by a breakdown of intelligence warning."

The breakdown happened, in part, because there was poor information-sharing among analysts (in different FBI offices, for example). At a conceptual level, US intelligence knew that al-Qaeda actions almost always involve multiple, near-simultaneous attacks; however, the FBI did not assimilate piecemeal information on oddly behaving foreign flight-training students into this context. On the day of the hijackings, under tremendous time pressure, no analyst associated the multiple hijackings with the multiple-attack signature of al-Qaeda.
The failure to conceive that a major attack could occur within the US left the country unprepared. For example, irregularities detected by the Federal Aviation Administration and the North American Aerospace Defense Command did not flow into a center where analysts could consolidate this information and (ideally) collate it with earlier reports of odd behavior among certain pilot trainees, or with the possibility of hijacked airliners being used as weapons.

Inappropriate analogies are yet another cognitive trap. Though analogies may be extremely useful, they become dangerous when forced, or when they rest on assumptions of cultural or contextual equivalence. Avoiding such analogies is difficult when analysts are merely unaware of differences between their own context and that of others; it becomes extremely difficult when they do not realize that important knowledge is missing. Difficulties associated with admitting one's ignorance are an additional barrier to avoiding such traps. Such ignorance can take the form of insufficient study: a lack of factual information or understanding, an inability to mesh new facts with old, or a simple denial of conflicting facts.